Model selection and sparse recovery are two important problems for which many regularization methods have been proposed. We study the properties of regularization methods in both problems under the unified framework of regularized least squares with concave penalties. For model selection, we establish conditions under which a regularized least squares estimator enjoys a nonasymptotic property, called the weak oracle property, where the dimensionality can grow exponentially with sample size. For sparse recovery, we present a sufficient condition that ensures the recoverability of the sparsest solution. In particular, we approach both problems by considering a family of penalties that give a smooth homotopy between $L_0$ and $L_1$ penalties. We also propose the sequentially and iteratively reweighted squares (SIRS) algorithm for sparse recovery. Numerical studies support our theoretical results and demonstrate the advantage of our new methods for model selection and sparse recovery.
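To illustrate the flavor of reweighted-squares methods for sparse recovery, here is a minimal iteratively reweighted least squares sketch in the FOCUSS style. It is a generic stand-in under assumed settings (penalty exponent `p`, damping `eps`), not the paper's exact SIRS algorithm: each step solves a weighted minimum-norm problem for the underdetermined system $Ax = b$, with weights derived from the current iterate so that small entries are driven toward zero.

```python
import numpy as np

def irls_sparse_recovery(A, b, p=0.5, n_iter=50, eps=1e-8):
    """Iteratively reweighted least squares for A x = b (A underdetermined),
    encouraging sparse solutions by approximately minimizing sum |x_i|^p.
    A generic FOCUSS-style sketch, not the paper's exact SIRS algorithm."""
    m, n = A.shape
    # start from the minimum-L2-norm solution
    x = A.T @ np.linalg.solve(A @ A.T, b)
    for _ in range(n_iter):
        # weights w_i = |x_i|^(2-p): the weighted L2 problem
        # min sum x_i^2 / w_i  s.t.  A x = b  mimics the concave penalty;
        # eps guards against exactly-zero weights
        w = np.abs(x) ** (2 - p) + eps
        AW = A * w  # column scaling, equals A @ diag(w)
        # closed-form minimizer: x = W A^T (A W A^T)^{-1} b, mildly damped
        x = w * (A.T @ np.linalg.solve(AW @ A.T + eps * np.eye(m), b))
    return x

# usage: recover a 3-sparse vector from 12 random Gaussian measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((12, 30))
x_true = np.zeros(30)
x_true[[2, 7, 19]] = [1.5, -2.0, 1.0]
b = A @ x_true
x_hat = irls_sparse_recovery(A, b)
```

On well-posed instances like the one above, the iterates typically concentrate on the true support while the remaining entries shrink toward zero; the exponent `p` interpolates the behavior between $L_1$-like ($p = 1$) and more aggressively $L_0$-like ($p \to 0$) shrinkage, loosely mirroring the homotopy family discussed in the abstract.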